Approximate Inference in Probabilistic Graphical Models with Determinism
Abstract
In the proposed thesis, we study a special class of belief networks that contain both probabilistic and deterministic information. Deterministic information occurs as zero probabilities in the belief network. A majority of the work in the belief network community (see, for example, papers in conferences such as UAI, AAAI, IJCAI and NIPS) addresses probabilistic inference tasks under the assumption that the underlying joint distribution represented by the belief network is strictly positive, i.e., devoid of any determinism. The positivity assumption is problematic because (a) modeling many real-world problems, such as genetic linkage analysis (Fishelson & Geiger 2003), requires that the inference method reason with both probabilistic and deterministic information, and (b) inference is harder in the presence of determinism or extreme probabilities (Dagum & Luby 1993). The purpose of the proposed thesis is to study both the representational and algorithmic issues involved in modeling deterministic information alongside the usual probabilistic information in a belief network.
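To make the notion of determinism as zero probabilities concrete, here is a minimal sketch (a toy model of my own, not taken from the thesis) of a two-variable belief network P(A)·P(B|A) in which the conditional probability table of B contains zeros, deterministically forcing B to equal A:

```python
# Prior over binary variable A.
P_A = {0: 0.4, 1: 0.6}

# Deterministic CPT: P(B=b | A=a) is 1 if b == a, else 0.
# The zero entries are the "deterministic information".
P_B_given_A = {(0, 0): 1.0, (1, 0): 0.0,
               (0, 1): 0.0, (1, 1): 1.0}

def joint(a, b):
    """Joint probability P(A=a, B=b)."""
    return P_A[a] * P_B_given_A[(b, a)]

# Half of the configurations have probability zero, so the joint
# distribution is not strictly positive: the positivity assumption
# made by much of the inference literature fails here.
zeros = [(a, b) for a in (0, 1) for b in (0, 1) if joint(a, b) == 0.0]
print(zeros)  # → [(0, 1), (1, 0)]
```

Even this tiny example violates the positivity assumption: any inference method that implicitly divides by, or conditions on, a zero-probability configuration must handle such entries explicitly.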
Similar resources
Rule-based joint fuzzy and probabilistic networks
One of the important challenges in graphical models is dealing with the uncertainties present in a problem. Among graphical networks, the fuzzy cognitive map is only capable of modeling fuzzy uncertainty, and the Bayesian network is only capable of modeling probabilistic uncertainty. In many real-world problems, we are faced with both fuzzy and probabilistic uncertainties. In these cases, the propo...
GiSS: Combining Gibbs Sampling and SampleSearch for Inference in Mixed Probabilistic and Deterministic Graphical Models
Mixed probabilistic and deterministic graphical models are ubiquitous in real-world applications. Unfortunately, Gibbs sampling, a popular MCMC technique, does not converge to the correct answers in the presence of determinism and therefore cannot be used for inference in such models. In this paper, we propose to remedy this problem by combining Gibbs sampling with SampleSearch, an advanced importa...
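The Gibbs-sampling failure mentioned above can be seen in a textbook example (my own illustration, not from the paper): take the target P(x, y) ∝ 1[x = y] over binary x, y. The two support states (0,0) and (1,1) each have probability 1/2, but each single-variable conditional is deterministic, so the chain can never move between them:

```python
def gibbs_step(x, y):
    # Resample x given y: P(x | y) puts all its mass on x = y,
    # so the "sample" is forced (no randomness survives the zeros).
    x = y
    # Resample y given x: likewise forced to y = x.
    y = x
    return x, y

# Start the chain in (0, 0) and run it for many sweeps.
x, y = 0, 0
visited = set()
for _ in range(1000):
    x, y = gibbs_step(x, y)
    visited.add((x, y))
print(visited)  # → {(0, 0)}: the chain is stuck and never reaches (1, 1)
```

The chain is non-ergodic: its estimate of P(x = 1) is 0 instead of 1/2, which is the convergence failure that motivates combining Gibbs sampling with a method such as SampleSearch that can jump between disconnected regions of the support.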
Inference by Reparameterization in Neural Population Codes
Behavioral experiments on humans and animals suggest that the brain performs probabilistic inference to interpret its environment. Here we present a new generalpurpose, biologically-plausible neural implementation of approximate inference. The neural network represents uncertainty using Probabilistic Population Codes (PPCs), which are distributed neural representations that naturally encode pro...
Learning with Graphical Models
Graphical models provide a powerful framework for probabilistic modelling and reasoning. Although the theory behind learning and inference is well understood, most practical applications require approximations to known algorithms. We review learning of thin junction trees, a class of graphical models that permits efficient inference. We discuss particular cases in clique graphs where exact inference ...
A Hybrid Approach for Probabilistic Inference using Random Projections
We introduce a new meta-algorithm for probabilistic inference in graphical models based on random projections. The key idea is to use approximate inference algorithms on an (exponentially) large number of samples, obtained by randomly projecting the original statistical model using universal hash functions. In the case where the approximate inference algorithm is a variational approximation, t...
Publication date: 2007